The History of Computers
Whether you know it or not, you depend on computers for almost everything
you do in modern life. From the second you get up in the morning to the second
you go to sleep, computers are tied into what you do and use in some way. They
are tied into your life in the most obvious and obscure ways. Take for example:
you wake up in the morning, usually to a digital alarm clock. You start your
car, and it uses computers the second you turn the key (General Motors is the
largest buyer of computer components in the world). You pick up the phone, and
it uses computers. No matter how hard you try to get away from them, you can't.
It is inevitable.
Many people think of the computer as a new invention, but in reality it is
very old, about 2,000 years old.1 The first computer was the abacus. This
invention was constructed of wood, two wires, and beads. It was a wooden rack
with the two wires strung across it horizontally, and the beads were strung
along the wires. It was used for ordinary arithmetic. These kinds of computers
are considered analog computers. Another analog computer was the circular slide
rule. This was invented in 1621 by William Oughtred, an English mathematician.
The slide rule was a mechanical device made of two rules, one sliding inside
the other, and marked with many number scales. It could do such calculations as
division, multiplication, roots, and logarithms.
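The slide rule works because sliding one logarithmic scale along another adds lengths, and adding logarithms is the same as multiplying numbers. Here is a minimal Python sketch of that principle (the function name is illustrative; this models the idea, not any historical device):

```python
import math

def slide_rule_multiply(a: float, b: float) -> float:
    """Multiply the way a slide rule does: add the logarithms
    of the two numbers, then read off the antilog."""
    # Sliding one log scale against the other physically adds log(a) + log(b).
    return math.pow(10, math.log10(a) + math.log10(b))

print(slide_rule_multiply(3.0, 4.0))  # ~12.0, within slide-rule precision
```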
Soon after came some more advanced computers. In 1642 came Blaise
Pascal's computer, the Pascaline. It is considered to be the first automatic
calculator. It consisted of gears and interlocking cogs, and numbers were
entered with dials. It was originally made for his father, a tax collector.2
Pascal went on to build 50 more of these Pascalines, but clerks would not use
them.3 They refused out of fear that they would lose their jobs.4
Soon after there were many similar inventions. There was the Leibniz
wheel, invented by Gottfried Leibniz. It got its name from the way it was
designed, with a cylinder with stepped teeth.5 It performed the same functions
as the other computers of its time.
Computers such as the Leibniz wheel and the Pascaline were not used
widely until the invention made by Thomas de Colmar (a.k.a. Charles Xavier
Thomas).6 It was the first successful mechanical calculator that could do all
the normal arithmetic functions. This type of calculator was improved by many
other inventors, so that by 1890 it could do a number of other things. The
improvements were that the machines could collect partial results, had a memory
function (they could store information), and could output information to a
printer. These improvements were made mainly for commercial use, and the
machines still had to be operated by hand.
Around 1812, in Cambridge, England, new advances in computers were made
by Charles Babbage. His idea was that long calculations could be done as a
series of steps repeated over and over many times.7 Ten years later, in 1822,
he had a working model, and in 1823 he began fabrication of his invention. He
called his invention the Difference Engine.
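Babbage's repeated-step idea is the method of finite differences: any polynomial can be tabulated using nothing but additions, the same small step repeated over and over. Here is a minimal Python sketch of that principle (the function name and layout are illustrative, not Babbage's design):

```python
def difference_engine_table(initial_values, n_rows):
    """Tabulate a polynomial by repeated addition, in the spirit of
    Babbage's Difference Engine: start from the value and its finite
    differences, then add neighbors pairwise, one row at a time."""
    # initial_values = [f(0), delta f(0), delta^2 f(0), ...] for polynomial f.
    row = list(initial_values)
    table = []
    for _ in range(n_rows):
        table.append(row[0])
        # Each entry absorbs the difference to its right: pure addition.
        row = [row[i] + row[i + 1] for i in range(len(row) - 1)] + [row[-1]]
    return table

# f(x) = x^2: f(0) = 0, first difference = 1, second difference = 2 (constant)
print(difference_engine_table([0, 1, 2], 6))  # [0, 1, 4, 9, 16, 25]
```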
In 1833 he stopped working on his Difference Engine because he had
another idea: to build an Analytical Engine. This would have been the first
digital computer that was fully program controlled. His invention was to do all
the general-purpose work of modern computers. This computer was to use punched
cards for storage, be powered by steam, and be operated by one person.8 The
computer was never finished, for many reasons. Among them were the lack of
precision mechanics and the fact that it could solve problems that did not need
to be solved at that time.9 After Babbage's computer, people lost interest in
this type of invention.10 Eventually, later inventions would create a demand
for the kind of calculating capability that computers like Babbage's would be
capable of providing.
In 1890 a new era of business computing evolved. This was a development
in punch card use, a step toward automated computing, first used in 1890 by
Herman Hollerith. Because of this, human error was reduced dramatically.11
Punch cards could hold 80 characters per card, and the machines could process
about 50-220 cards a minute. This was a means of easily accessible memory of
unlimited size.12 In 1896 Hollerith founded his company, the Tabulating Machine
Company; later, in 1924, after several mergers and takeovers, International
Business Machines (IBM) was formed.
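Those figures imply a concrete throughput. A quick back-of-the-envelope check in Python (the numbers are the essay's own; the function is just illustrative):

```python
def tabulator_throughput(chars_per_card=80, cards_per_minute=(50, 220)):
    """Characters read per minute for a punch-card machine, using the
    essay's figures: 80 characters per card, 50-220 cards a minute."""
    low, high = cards_per_minute
    return chars_per_card * low, chars_per_card * high

print(tabulator_throughput())  # (4000, 17600) characters per minute
```

Even the slowest rate, 4,000 characters a minute, was far beyond what a clerk could transcribe by hand, which helps explain why the error rate dropped so sharply.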
An invention during this time, in 1906, would influence the way that
computers were built in the future: the first vacuum tube. Later, in 1936, a
paper written by Alan Turing described a hypothetical digital computer.13
In 1939 came the first true digital computer. It was called the ABC,
and it was designed by Dr. John Atanasoff.
In 1942 John P. Eckert, John W. Mauchly, and associates decided to
build a high-speed computer. The computer they were to build would become known
as the ENIAC (Electronic Numerical Integrator And Calculator). The reason for
building it was the demand for high computing capacity at the beginning of
World War II.
Once built, the ENIAC took up 1,800 square feet of floor space.14 It
consisted of 18,000 vacuum tubes and drew 180,000 watts of power.15 The ENIAC
was rated to be 1,000 times faster than any previous computer. It was accepted
as the first successful high-speed computer, and it was used from 1946 to
1955.16
Around the same time a new computer was built that was more popular. It
was more popular because it not only had the ability to do calculations but
could also mimic the decision-making power of the human brain. When it was
finished in 1950 it became the fastest computer in the world.17 It was built by
the National Bureau of Standards on the campus of UCLA, and it was named the
National Bureau of Standards Western Automatic Computer, or the SWAC. It could
be said that the SWAC set the standard for computers from then up to present
times.18 This was because it had all the same primary units: a storage device,
an internal clock, an input/output device, and an arithmetic logic unit
consisting of a control unit and an arithmetic unit.
These computers were considered first-generation computers (1942-1958).
In 1948 John Bardeen, Walter Brattain, and William Shockley of Bell Labs
filed for the first patent on the transistor.19 This invention would lay the
foundation for second-generation computers (1958-1964).
Computers of the second generation were smaller (about the size of a
piano by now) and much quicker because of the new inventions of their time.
Computers used the much smaller transistor instead of the bulky vacuum tube.
Another invention that influenced second-generation computers, and every
generation after them, was the discovery of magnetic core memory. Now magnetic
tapes and disks were used to store programs instead of the programs being
stored inside the computer. This way the computer could be used for many
operations without being totally reprogrammed or rewired for each new task. All
you had to do was pop in another disk.
The third generation (1964-1970) was when computers were commercialized
more than ever before. This was because they were getting smaller and more
dependable.20 Also, the cost went down and the power requirements were lower.21
This was probably because of the invention of the silicon semiconductor. These
computers were used mainly in medical facilities and libraries for keeping
track of records and for various other purposes. These computers of the third
generation were the first microcomputers.
The generation of computers we are in now is the fourth generation; it
started in 1970. The fourth generation really started with an idea by Ted Hoff,
an employee of Intel, that all the processing units of a computer could be
placed on one single chip. This idea of his was not bought by many people.22 I
believe that without this idea, upgradeable computers would never have been
designed. Today, everything has a microprocessor built into it.23
The microcomputer was changed forever in 1976, when Steve Jobs and Steve
Wozniak sold a Volkswagen and a calculator for $1,300 to build the first
Apple.24 They did the work in their garage. They founded their company in 1976,
and by 1983 it had successfully made the Fortune 500 list.25
Five years after Apple was founded, in 1981, IBM announced the release of
the IBM PC. Over the next 18 months the IBM PC would become an industry
standard.26
From 1980 on there was a large demand for microcomputers such as the IBM
PC and the Apple, not only in industry but also in the homes of many people.
Many other computers appeared during the '80s, such as the Commodore, the
Tandy, the Atari, and game systems such as the Nintendo, among many others.
There was also a large demand for computer games for the home PC. Because of
these many demands, the companies became very competitive, pushing for the
faster, better computer. By the late '80s, because of this demand,
microprocessors could handle 32 bits of data at a time, pushing over 4 million
instructions processed a second.27
It may seem as if, over time, computers have evolved into totally
different machines, but put in perspective they are also much alike. On the
other hand, with almost every business and many families today demanding better
and newer computers, it seems that if you buy a new computer today, the
industry has made it obsolete before you even unpack it. This is probably
because the better you make a computer and the quicker it can do calculations,
the quicker it can help you design a new computer that is even faster. It is a
domino effect that started some 2,000 years ago and will probably never end.
Who knows what is in store for the future, or, you could say, for the fifth
generation of computers.